Softmax GAN

Author

  • Min Lin
Abstract

Softmax GAN is a novel variant of Generative Adversarial Network (GAN). The key idea of Softmax GAN is to replace the classification loss in the original GAN with a softmax cross-entropy loss in the sample space of one single batch. In the adversarial learning of N real training samples and M generated samples, the target of discriminator training is to distribute all the probability mass to the real samples, each with probability 1/M, and distribute zero probability to generated data. In the generator training phase, the target is to assign equal probability to all data points in the batch, each with probability 1/(M+N). While the original GAN is closely related to Noise Contrastive Estimation (NCE), we show that Softmax GAN is the Importance Sampling version of GAN. We further demonstrate with experiments that this simple change stabilizes GAN training.
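The batch-softmax targets described in the abstract can be sketched in a few lines of plain Python. This is a minimal illustration, not the paper's implementation: the scores are random stand-ins for discriminator outputs, and equal batch halves (M = N) are assumed so that the 1/M mass per real sample sums to one.

```python
import math
import random

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

N = M = 4  # equal numbers of real and generated samples in the batch
random.seed(0)
scores = [random.gauss(0, 1) for _ in range(N + M)]  # D's scores, real first
p = softmax(scores)  # probability mass over the whole batch

# Discriminator target: all mass on the real samples (1/M each, M = N here),
# zero mass on the generated samples.
t_d = [1.0 / M] * N + [0.0] * M
# Generator target: uniform mass 1/(M+N) on every sample in the batch.
t_g = [1.0 / (M + N)] * (N + M)

eps = 1e-12
loss_d = -sum(t * math.log(q + eps) for t, q in zip(t_d, p))  # D's cross-entropy
loss_g = -sum(t * math.log(q + eps) for t, q in zip(t_g, p))  # G's cross-entropy
```

Because the generator target is the uniform distribution over the batch, its cross-entropy loss is bounded below by log(M+N), reached exactly when the softmax over the scores is uniform.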


Similar articles

GANS for Sequences of Discrete Elements with the Gumbel-softmax Distribution

Generative Adversarial Networks (GAN) have limitations when the goal is to generate sequences of discrete elements. The reason for this is that samples from a distribution on discrete objects such as the multinomial are not differentiable with respect to the distribution parameters. This problem can be avoided by using the Gumbel-softmax distribution, which is a continuous approximation to a mu...
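The Gumbel-softmax relaxation mentioned above can be sketched as follows (an illustrative stdlib-only version; deep-learning frameworks provide equivalent built-ins): Gumbel(0, 1) noise is added to the logits and a temperature-scaled softmax is applied, so the output is a differentiable point on the probability simplex that approaches a one-hot sample as the temperature goes to zero.

```python
import math
import random

def gumbel_softmax(logits, tau=1.0):
    """Continuous relaxation of sampling from a categorical distribution.

    Adds Gumbel(0, 1) noise to each logit and applies a softmax at
    temperature tau; for tau > 0 the result is differentiable in the
    logits, and as tau -> 0 it approaches a one-hot sample.
    """
    # Gumbel noise via inverse transform: g = -log(-log(U)), U ~ Uniform(0, 1)
    noisy = [(l - math.log(-math.log(random.random()))) / tau for l in logits]
    m = max(noisy)
    es = [math.exp(x - m) for x in noisy]
    s = sum(es)
    return [e / s for e in es]

random.seed(0)
probs = [0.1, 0.2, 0.7]
y = gumbel_softmax([math.log(p) for p in probs], tau=0.5)
# y is a valid distribution: non-negative entries summing to 1
```

At low temperatures most of the mass lands on a single entry, which is what lets a GAN generator backpropagate through "sampling" a discrete element.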


Variational Autoencoder for Deep Learning of Images, Labels and Captions: Supplementary Material

Table 1: Semi-supervised classification accuracy (%) on the validation set of ImageNet 2012.

Proportion       1%            5%            10%           20%           30%           40%
AlexNet (top-1)  0.1 ± 0.01    11.5 ± 0.72   19.8 ± 0.71   38.6 ± 0.31   43.23 ± 0.28  45.85 ± 0.23
GoogLeNet        4.75 ± 0.58   22.13 ± 1.14  32.18 ± 0.80  42.83 ± 0.28  49.61 ± 0.11  51.90 ± 0.20
BSVM (ours)      43.98 ± 1.15  47.36 ± 0.91  48.41 ± 0.76  51.51 ± 0.28  54.14 ± 0.12  57.34 ± 0.18
Softmax (ours)   42...


Feature Generating Networks for Zero-Shot Learning

Suffering from the extreme training data imbalance between seen and unseen classes, most existing state-of-the-art approaches fail to achieve satisfactory results for the challenging generalized zero-shot learning task. To circumvent the need for labeled examples of unseen classes, we propose a novel generative adversarial network (GAN) that synthesizes CNN features conditioned on class-level...


MTGAN: Speaker Verification through Multitasking Triplet Generative Adversarial Networks

In this paper, we propose an enhanced triplet method that improves the encoding process of embeddings by jointly utilizing generative adversarial mechanism and multitasking optimization. We extend our triplet encoder with Generative Adversarial Networks (GANs) and softmax loss function. GAN is introduced for increasing the generality and diversity of samples, while softmax is for reinforcing fe...


Soft-Margin Softmax for Deep Classification

In deep classification, the softmax loss (Softmax) is arguably one of the most commonly used components to train deep convolutional neural networks (CNNs). However, such a widely used loss is limited because it does not explicitly encourage the discriminability of features. Recently, the large-margin softmax loss (L-Softmax [14]) was proposed to explicitly enhance feature discrimination, with hard mar...
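The margin idea behind losses in this family can be illustrated with a generic additive-margin softmax cross-entropy. This is a hypothetical minimal sketch, not the paper's exact formulation (the abstract above is truncated before the details): the target-class logit is reduced by a margin m before the softmax, so the network must score the true class higher by at least m to achieve the same loss as plain softmax.

```python
import math

def margin_softmax_loss(logits, target, m=0.5):
    """Softmax cross-entropy with an additive margin on the target logit.

    Subtracting m from the target-class logit before the softmax makes the
    objective harder to satisfy, which encourages larger inter-class score
    gaps. Setting m = 0 recovers the plain softmax cross-entropy loss.
    """
    adjusted = list(logits)
    adjusted[target] -= m  # apply the margin to the true class only
    mx = max(adjusted)
    # numerically stable log-sum-exp
    log_sum = mx + math.log(sum(math.exp(x - mx) for x in adjusted))
    return log_sum - adjusted[target]

logits = [2.0, 0.5, -1.0]
plain = margin_softmax_loss(logits, target=0, m=0.0)
margined = margin_softmax_loss(logits, target=0, m=0.5)
# the same correct prediction incurs a larger loss under the margin
```

The gradient therefore keeps pushing the target logit up even after the example is already classified correctly, which is the mechanism margin-based softmax variants use to tighten intra-class features.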




Journal:
  • CoRR

Volume abs/1704.06191  Issue -

Pages -

Publication date 2017